fix(blocks): resolve Ollama models incorrectly requiring API key in Docker (#3976)
waleedlatif1 merged 4 commits into staging
Conversation
…ocker

Server-side validation failed for Ollama models like `mistral:latest` because the Zustand providers store is empty on the server and `getProviderFromModel` misidentified them via regex pattern matching (e.g. `mistral:latest` matched Mistral AI's `/^mistral/` pattern).

Replace the hardcoded `CLOUD_PROVIDER_PREFIXES` list with existing data sources:
- Provider store (definitive on the client; checks all provider buckets)
- `getBaseModelProviders()` from `PROVIDER_DEFINITIONS` (server-side static cloud model lookup)
- Slash convention for dynamic cloud providers (`fireworks/`, `openrouter/`, etc.)
- `isOllamaConfigured` feature flag using the existing `OLLAMA_URL` env var

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
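The misidentification described in the commit message can be reproduced in isolation. A minimal sketch: the `/^mistral/` pattern is taken from the commit message, but the map and helper below are hypothetical reconstructions of the old prefix-based lookup, not code from the PR.

```typescript
// Hypothetical stand-in for the removed regex-based provider lookup.
// Only /^mistral/ is confirmed by the commit message; the rest is illustrative.
const cloudProviderPatterns: Record<string, RegExp> = {
  'mistral-ai': /^mistral/,
  openai: /^gpt/,
};

function guessProviderByRegex(model: string): string | null {
  for (const [provider, pattern] of Object.entries(cloudProviderPatterns)) {
    if (pattern.test(model)) return provider;
  }
  return null;
}

// A locally served Ollama model is wrongly matched to a cloud provider,
// which is the failure mode the PR removes.
console.log(guessProviderByRegex('mistral:latest')); // prints: mistral-ai (wrong for an Ollama model)
```

Because the match is purely lexical, any Ollama tag sharing a cloud model's prefix would trigger the same false positive.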
PR Summary (Medium Risk)

Overview: Reworks provider detection to prefer the client

Reviewed by Cursor Bugbot for commit 8225115.
Greptile Summary

This PR fixes a Docker deployment regression where Ollama models (e.g.

Key changes:
Correctness notes:
Confidence Score: 5/5. Safe to merge — the fix is targeted, well-tested, and the logic is sound.

All prior P1/P2 concerns from previous review rounds have been addressed and explained by the author. No new P0 or P1 issues found. The three-layer resolution strategy (store → isOllamaConfigured + explicit map → require key) correctly handles all provider paths, including the regression case. No files require special attention.

Important Files Changed
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
A[shouldRequireApiKeyForModel] --> B{model empty?}
B -- yes --> C[return false]
B -- no --> D{isHosted?}
D -- yes --> E{in hostedModels?}
E -- yes --> C
E -- no --> F{vertex/ or bedrock/?}
D -- no --> F
F -- yes --> C
F -- no --> G{isAzureConfigured\nAND azure prefix/model?}
G -- yes --> C
G -- no --> H{vllm/ prefix?}
H -- yes --> C
H -- no --> I[getProviderFromStore]
I --> J{storeProvider?}
J -- ollama or vllm --> C
J -- other provider --> K[return true]
J -- null --> L{isOllamaConfigured?}
L -- no --> K
L -- yes --> M{has slash?}
M -- yes --> K
M -- no --> N{in getBaseModelProviders?}
N -- yes --> K
N -- no --> O[return false\nOllama model assumed]
```
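The flowchart above can be sketched in TypeScript. This is a hedged reconstruction, not the PR's actual code: the dependency names mirror those in the diagram (`getProviderFromStore`, `getBaseModelProviders`, `isOllamaConfigured`, etc.), but they are injected here as plain values so the sketch is self-contained, and the `Deps` shape is an assumption.

```typescript
// Assumed dependency bundle; in the real code these come from the provider
// store, PROVIDER_DEFINITIONS, and env-derived feature flags.
interface Deps {
  isHosted: boolean;
  hostedModels: Set<string>;
  isAzureConfigured: boolean;
  isAzureModel: (model: string) => boolean;
  getProviderFromStore: (model: string) => string | null;
  isOllamaConfigured: boolean;
  baseModelProviders: Map<string, string>; // static cloud models, per getBaseModelProviders()
}

function shouldRequireApiKeyForModel(model: string, deps: Deps): boolean {
  if (!model) return false;
  if (deps.isHosted && deps.hostedModels.has(model)) return false;
  if (model.startsWith('vertex/') || model.startsWith('bedrock/')) return false;
  if (deps.isAzureConfigured && deps.isAzureModel(model)) return false;
  if (model.startsWith('vllm/')) return false;

  // Layer 1: the provider store is definitive on the client.
  const storeProvider = deps.getProviderFromStore(model);
  if (storeProvider === 'ollama' || storeProvider === 'vllm') return false;
  if (storeProvider !== null) return true; // known cloud provider

  // Layers 2-3: store is empty (e.g. server side), fall back to static data.
  if (!deps.isOllamaConfigured) return true;
  if (model.includes('/')) return true; // slash convention: dynamic cloud provider
  if (deps.baseModelProviders.has(model)) return true; // static cloud model
  return false; // otherwise assume an Ollama model
}
```

With the store empty and `OLLAMA_URL` set, `mistral:latest` falls all the way through to the final branch and no longer requires a key, while `gpt-4o` is caught by the static cloud-model lookup.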
Reviews (5): Last reviewed commit: "fix: handle vLLM models in store provide..."
@greptile

@cursor review

✅ Bugbot reviewed your changes and found no new issues!
Comment @cursor review or bugbot run to trigger another review on this PR
Reviewed by Cursor Bugbot for commit 3f855a1.
…idation

The fallback was the last piece of regex-based matching in the function and only ran for self-hosted deployments without OLLAMA_URL on the server — a path where Ollama models cannot appear in the dropdown anyway.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@greptile

@cursor review

✅ Bugbot reviewed your changes and found no new issues!
Reviewed by Cursor Bugbot for commit adfdf8f.
vLLM is a local model server like Ollama and should not require an API key. Add vllm to the store provider check as a safety net for models that may not have the vllm/ prefix.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
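The safety net described in this commit can be illustrated with a small stand-in for the store lookup. The bucket names and model IDs below are assumptions for illustration, not taken from the PR.

```typescript
// Hypothetical store shape: provider buckets mapping provider id -> model ids.
const providerBuckets: Record<string, string[]> = {
  vllm: ['meta-llama/Llama-3.1-8B-Instruct'], // served locally, no "vllm/" prefix
  ollama: ['mistral:latest'],
  openai: ['gpt-4o'],
};

// Check every bucket, so a vLLM model without the "vllm/" prefix still
// resolves to a local provider and skips the API-key requirement.
function getProviderFromStore(model: string): string | null {
  for (const [provider, models] of Object.entries(providerBuckets)) {
    if (models.includes(model)) return provider;
  }
  return null;
}

const provider = getProviderFromStore('meta-llama/Llama-3.1-8B-Instruct');
const requiresKey = provider !== 'vllm' && provider !== 'ollama';
console.log(provider, requiresKey); // prints: vllm false
```

Without the extra `vllm` check, a prefix-less vLLM model would resolve to a non-null store provider and be treated as cloud-hosted, re-introducing the spurious API-key requirement.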
@greptile

@cursor review

✅ Bugbot reviewed your changes and found no new issues!
Reviewed by Cursor Bugbot for commit 8225115.
Summary
- `mistral:latest` failed server-side validation with "Agent is missing required fields: API Key" in Docker deployments because the Zustand providers store is empty on the server and `getProviderFromModel` misidentified them via regex pattern matching
- Replaced the hardcoded `CLOUD_PROVIDER_PREFIXES` list with existing data sources: the providers store (client-side), `getBaseModelProviders()` from `PROVIDER_DEFINITIONS` (server-side), and the slash convention for dynamic cloud providers
- Added an `isOllamaConfigured` feature flag using the existing `OLLAMA_URL` env var

Fixes #3974
Test plan
- `shouldRequireApiKeyForModel`: `mistral:latest` with `OLLAMA_URL` set does not require an API key
- Cloud models (`gpt-4o`, `claude-sonnet-4-5`) correctly require an API key even with `OLLAMA_URL` set
- In Docker with `OLLAMA_URL` set, run a workflow with an Ollama model